52 research outputs found

    Probabilistic Robustness Analysis of Stochastic Jump Linear Systems

    In this paper, we propose a new method to measure the probabilistic robustness of stochastic jump linear systems with respect to both initial state uncertainties and randomness in switching. The Wasserstein distance, which defines a metric on the manifold of probability density functions, is used as the tool for the performance and stability measures. Starting with a Gaussian distribution representing the initial state uncertainties, the probability density function of the system state evolves into a Gaussian mixture whose number of components grows exponentially. To cope with the computational complexity caused by this mixture, we prove that there exists an alternative probability density function that preserves the exact information at the Wasserstein level. The usefulness and efficiency of the proposed methods are demonstrated by example. Comment: 2014 ACC (American Control Conference) paper
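    The robustness metric above relies on the Wasserstein-2 distance, which has a closed form between Gaussian densities. The sketch below is an illustration of that formula only, not the authors' implementation; the function name w2_gaussian and the NumPy/SciPy usage are assumptions.

        # Minimal sketch (assumption, not the paper's code): closed-form W2 distance
        # between N(m1, S1) and N(m2, S2):
        #   W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}).
        import numpy as np
        from scipy.linalg import sqrtm

        def w2_gaussian(m1, S1, m2, S2):
            rS2 = sqrtm(S2)                        # matrix square root of S2
            cross = np.real(sqrtm(rS2 @ S1 @ rS2)) # cross-covariance term
            w2_sq = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross)
            return np.sqrt(max(w2_sq, 0.0))

        # Example: distance between an initial Gaussian and one propagated component.
        m1, S1 = np.zeros(2), np.eye(2)
        m2, S2 = np.array([1.0, 0.0]), 0.5 * np.eye(2)
        print(w2_gaussian(m1, S1, m2, S2))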

    Geodesic Density Tracking with Applications to Data Driven Modeling

    Many problems in dynamic data-driven modeling deal with distributed rather than lumped observations. In this paper, we show that the Monge-Kantorovich optimal transport theory provides a unifying framework to tackle such problems in the systems-control parlance. Specifically, given distributional measurements at arbitrary instances of measurement availability, we show how to derive dynamical systems that interpolate the observed distributions along the geodesics. We demonstrate the framework in the context of three specific problems: (i) \emph{finding a feedback control} to track observed ensembles over a finite horizon, (ii) \emph{finding a model} whose prediction matches the observed distributional data, and (iii) \emph{refining a baseline model} that results in a distribution-level prediction-observation mismatch. We emphasize how the three problems can be posed as variants of the optimal transport problem, but lead to different types of numerical methods depending on the problem context. Several examples are given to elucidate the ideas. Comment: 8 pages, 7 figures
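    For scalar ensembles, the geodesic interpolation described above reduces to interpolating quantile functions. The sketch below is an assumed illustration of that one-dimensional case, not the paper's implementation; the function name and sampling choices are hypothetical.

        # Minimal sketch (assumption): W2 geodesic between two 1-D empirical laws,
        # obtained by linearly interpolating their quantile functions.
        import numpy as np

        def wasserstein_geodesic_1d(x0, x1, t, n_quantiles=200):
            q = np.linspace(0.0, 1.0, n_quantiles)
            q0 = np.quantile(x0, q)            # quantiles of the first ensemble
            q1 = np.quantile(x1, q)            # quantiles of the second ensemble
            return (1.0 - t) * q0 + t * q1     # intermediate distribution at time t

        # Example: halfway point between two Gaussian ensembles.
        rng = np.random.default_rng(0)
        x0 = rng.normal(0.0, 1.0, 1000)
        x1 = rng.normal(4.0, 0.5, 1000)
        print(wasserstein_geodesic_1d(x0, x1, 0.5).mean())   # roughly 2.0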

    Wasserstein Consensus ADMM

    We introduce the Wasserstein consensus alternating direction method of multipliers (ADMM) and its entropic-regularized version, the Sinkhorn consensus ADMM, to solve measure-valued optimization problems with convex additive objectives. Several problems of interest in stochastic prediction and learning can be cast in this form of measure-valued convex additive optimization. The proposed algorithm generalizes a variant of the standard Euclidean ADMM to the space of probability measures but departs significantly from its Euclidean counterpart. In particular, we derive a two-layer ADMM algorithm wherein the outer layer is a variant of consensus ADMM on the space of probability measures, while the inner layer is a variant of Euclidean ADMM. The resulting computational framework is particularly suitable for solving Wasserstein gradient flows via distributed computation. We demonstrate the proposed framework using illustrative numerical examples.
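    The entropic-regularized transport step underlying the Sinkhorn variant can be illustrated with the standard Sinkhorn iteration between two discrete measures. This is a generic sketch under assumed names and parameters, not the two-layer algorithm proposed in the paper.

        # Minimal sketch (assumption): Sinkhorn iterations for entropic-regularized
        # optimal transport between discrete probability vectors a and b.
        import numpy as np

        def sinkhorn(a, b, C, eps=0.1, n_iters=500):
            K = np.exp(-C / eps)               # Gibbs kernel from cost matrix C
            u, v = np.ones_like(a), np.ones_like(b)
            for _ in range(n_iters):
                u = a / (K @ v)                # match the row marginal a
                v = b / (K.T @ u)              # match the column marginal b
            return u[:, None] * K * v[None, :] # approximate transport plan

        # Example: plan between two uniform measures on a small grid.
        x = np.linspace(0.0, 1.0, 5)
        C = (x[:, None] - x[None, :]) ** 2     # squared-distance cost
        a = b = np.full(5, 0.2)
        print(sinkhorn(a, b, C).sum(axis=1))   # approximately equals a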
    • …